Markov and Chebyshev Inequalities, and the Weak Law of Large Numbers

Muhammad Haris Rao


Lemma: Let $f : \mathbb{R}_{\ge 0} \longrightarrow \mathbb{R}_{\ge 0}$ be an increasing measurable function, and $X$ a non-negative random variable. Then for any $a \ge 0$ with $f(a) > 0$, $$ \mathbf{P} \left( X \ge a \right) \le \frac{\mathbf{E}[f(X)]}{f(a)} $$

Proof. We claim that $\mathbf{1} \{ X \ge a \} \le f(X)/ f(a)$ pointwise. If $X < a$, then $$ \mathbf{1} \{ X \ge a \} = 0 \le \frac{f(X)}{f(a)} $$ and if $X \ge a$, then $$ \mathbf{1} \{ X \ge a \} = 1 = \frac{f(a)}{f(a)} \le \frac{f(X)}{f(a)} $$ by monotonicity of $f$. Taking expectations, $$ \mathbf{P} \left( X \ge a \right) = \mathbf{E} \left[ \mathbf{1} \{ X \ge a \} \right] \le \mathbf{E} \left[ \frac{f(X)}{f(a)} \right] = \frac{\mathbf{E}[f(X)]}{f(a)} $$ $\blacksquare$

Taking $f$ in the lemma above to be the identity function gives

Corollary (Markov Inequality): If $X \ge 0$ and $a > 0$ then $$ \mathbf{P} \left( X \ge a \right) \le \frac{\mathbf{E}[X]}{a} $$
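As a quick numerical sanity check (not part of the argument), the sketch below probes the bound with an Exponential(1) variable, so that $\mathbf{E}[X] = 1$ and the true tail $\mathbf{P}(X \ge a) = e^{-a}$ is known. The sample size and threshold are arbitrary choices.

```python
import random

random.seed(0)

# X ~ Exponential(1), so E[X] = 1 and Markov gives P(X >= a) <= 1/a.
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

a = 3.0
empirical_tail = sum(x >= a for x in samples) / n  # estimates P(X >= 3) = e^{-3}
markov_bound = (sum(samples) / n) / a              # estimates E[X] / a = 1/3

print(empirical_tail, markov_bound)
```

Note that the bound is quite loose here ($e^{-3} \approx 0.05$ against a bound of $1/3$), which is typical: Markov's inequality uses only the mean of $X$.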

Likewise, taking $f$ to be the map $x \longmapsto x^2$ and applying the lemma to the non-negative random variable $|X - \mathbf{E}[X]|$, whose squared expectation is $\mathbf{E}\left[ |X - \mathbf{E}[X]|^2 \right] = \text{Var}[X]$, gives

Corollary (Chebyshev Inequality): If $X$ is a random variable with finite variance and $a > 0$, then $$ \mathbf{P} \left( |X - \mathbf{E}[X]| \ge a \right) \le \frac{\text{Var} \left[ X \right]}{a^2} $$
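This bound can also be checked by simulation. The sketch below (an illustrative choice, not tied to the text) uses a Uniform(0, 1) variable, for which the mean is $1/2$, the variance is $1/12$, and the deviation probability $\mathbf{P}(|X - 1/2| \ge 0.4) = 0.2$ is known exactly:

```python
import random
import statistics

random.seed(1)

# X ~ Uniform(0, 1): mean 1/2, variance 1/12.
n = 100_000
samples = [random.random() for _ in range(n)]

a = 0.4
mean = statistics.fmean(samples)
var = statistics.pvariance(samples)

empirical = sum(abs(x - mean) >= a for x in samples) / n  # true value is 0.2
chebyshev_bound = var / a ** 2                            # (1/12) / 0.16 ~ 0.52

print(empirical, chebyshev_bound)
```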

Weak Law of Large Numbers


These inequalities give a simple proof of the weak law of large numbers under the condition of a finite second moment.

Theorem (Weak Law of Large Numbers): If $\{ X_n \}_{n \ge 1}$ is an i.i.d. sequence of random variables, each with finite second moment and expectation $\mu$, then $\frac{1}{n} \sum_{k = 1}^n X_k \overset{p}{\longrightarrow} \mu$ as $n \to \infty$. That is, for every $\varepsilon > 0$, $$ \lim_{n \to \infty} \mathbf{P} \left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - \mu \right| \ge \varepsilon \right) = 0 $$

Proof. If $\varepsilon > 0$, then by Chebyshev's inequality, $$ \mathbf{P} \left( \left| \frac{1}{n} \sum_{k = 1}^n X_k - \mu \right| \ge \varepsilon \right) \le \frac{\text{Var} \left[ \frac{1}{n} \sum_{k = 1}^n X_k \right]}{\varepsilon^2} = \frac{\sum_{k = 1}^n \text{Var}[X_k]}{n^2 \varepsilon^2} = \frac{\text{Var}[X_1]}{n \varepsilon^2} \longrightarrow 0 \text{ as $n \to \infty$} $$ where the second step uses independence (so the variances add) and the third uses that the $X_k$ are identically distributed. This is exactly the convergence in probability $\frac{1}{n} \sum_{k = 1}^n X_k \overset{p}{\longrightarrow} \mu$ as $n \to \infty$. $\blacksquare$
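The shrinking deviation probability can be observed directly. The sketch below (an illustrative setup, with arbitrary choices of $\varepsilon$, sample sizes, and trial count) estimates $\mathbf{P}\left( \left| \frac{1}{n} \sum_{k=1}^n X_k - \mu \right| \ge \varepsilon \right)$ for fair Bernoulli samples ($\mu = 1/2$) by repeating the experiment many times:

```python
import random

random.seed(2)

def deviation_prob(n, eps=0.05, trials=2000):
    """Estimate P(|(1/n) sum X_k - mu| >= eps) for n fair Bernoulli
    samples (mu = 1/2), by repeating the experiment `trials` times."""
    hits = 0
    for _ in range(trials):
        sample_mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(sample_mean - 0.5) >= eps:
            hits += 1
    return hits / trials

# The deviation probability should shrink as n grows.
p_small_n = deviation_prob(50)
p_large_n = deviation_prob(2000)
print(p_small_n, p_large_n)
```

For $n = 50$ the deviation event is still common, while for $n = 2000$ it is essentially never observed, consistent with the $\text{Var}[X_1] / (n \varepsilon^2)$ bound from the proof.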